Lower Rank Approximation of Matrices by Least Squares with Any Choice of Weights
Authors
Abstract
Reduced rank approximation of matrices has hitherto been possible only by unweighted least squares. This paper presents iterative techniques for obtaining such approximations when weights are introduced. The techniques involve criss-cross regressions with careful initialization. Possible applications of the approximation are in modelling, biplotting, contingency table analysis, fitting of missing values, checking outliers, etc.
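The abstract only names the technique, so the sketch below is a rough illustration rather than the authors' procedure: it alternates row-wise and column-wise weighted regressions (the criss-cross idea) to fit a rank-r factorization A B' minimizing the weighted sum of squares over all cells. The function name, the random initialization (the paper instead calls for careful initialization), and the fixed iteration count are my own assumptions.

import numpy as np

def weighted_low_rank(X, W, rank, n_iter=200, seed=0):
    """Fit X ~= A @ B.T of the given rank by minimizing
    sum(W * (X - A @ B.T) ** 2), alternating weighted regressions:
    rows of A with B held fixed, then rows of B with A held fixed."""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    A = rng.standard_normal((n, rank))   # simplistic random start, not the
    B = rng.standard_normal((m, rank))   # careful initialization of the paper
    for _ in range(n_iter):
        for i in range(n):               # regress row i of X on B, weights W[i, :]
            Wi = np.diag(W[i, :])
            A[i] = np.linalg.solve(B.T @ Wi @ B, B.T @ Wi @ X[i, :])
        for j in range(m):               # regress column j of X on A, weights W[:, j]
            Wj = np.diag(W[:, j])
            B[j] = np.linalg.solve(A.T @ Wj @ A, A.T @ Wj @ X[:, j])
    return A, B

Setting W[i, j] = 0 simply drops cell (i, j) from the fit, which is one way missing values or suspected outliers can be accommodated in this framework.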
Similar Papers
Maximum Likelihood in Generalized Fixed Score Factor Analysis
We study the weighted least squares fixed rank approximation problem in which the weight matrices depend on unknown parameters. The classical example is fixed score factor analysis (FSFA), where the weights depend on the unknown uniquenesses.
Theory of block-pulse functions in numerical solution of Fredholm integral equations of the second kind
Recently, the block-pulse functions (BPFs) have been used to solve electromagnetic scattering problems, which are modeled as linear Fredholm integral equations (FIEs) of the second kind. But the theoretical aspects of this method have not been fully investigated yet. In this article, in addition to presenting a new approach for solving FIE of the second kind, the theory of both methods is investigated as a...
The Approximation of One Matrix by Another of Lower Rank (Carl Eckart and Gale Young)
The mathematical problem of approximating one matrix by another of lower rank is closely related to the fundamental postulate of factor-theory. When formulated as a least-squares problem, the normal equations cannot be immediately written down, since the elements of the approximate matrix are not independent of one another. The solution of the problem is simplified by first expressing the matr...
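In the unweighted least-squares setting, the result described here amounts to truncating the singular value decomposition. A minimal restatement in NumPy (modern notation and a function name of my own choosing, not the paper's derivation):

import numpy as np

def best_rank_k(X, k):
    """Best rank-k approximation of X in the unweighted least-squares
    sense: keep the k largest singular values and their vectors."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U[:, :k] * s[:k]) @ Vt[:k, :]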
Fast Implementation of Low Rank Approximation of a Sylvester Matrix
In [16], the authors described an algorithm based on Structured Total Least Norm (STLN) for constructing a Sylvester matrix of given lower rank and obtaining the nearest perturbed polynomials with exact GCD of given degree. For their algorithm, the overall computation time depends on solving a sequence of least squares (LS) problems. In this paper, a fast implementation for solving these LS problems i...
On the Global Convergence of the Alternating Least Squares Method for Rank-One Approximation to Generic Tensors
Tensor decomposition has important applications in various disciplines, but it remains an extremely challenging task even to this date. A slightly more manageable endeavor has been to find a low rank approximation in place of the decomposition. Even for this less stringent undertaking, it is an established fact that tensors beyond matrices can fail to have best low rank approximations, with the...
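For concreteness, here is a minimal sketch of the alternating least squares scheme referred to in the title, restricted to a third-order tensor; the function name, random start, normalization, and fixed iteration count are my own assumptions, not the paper's setup.

import numpy as np

def rank_one_als(T, n_iter=100, seed=0):
    """Rank-one approximation lam * outer(a, b, c) of a 3-way tensor T:
    each unit-norm factor is updated in closed form with the other two fixed."""
    rng = np.random.default_rng(seed)
    b = rng.standard_normal(T.shape[1]); b /= np.linalg.norm(b)
    c = rng.standard_normal(T.shape[2]); c /= np.linalg.norm(c)
    for _ in range(n_iter):
        a = np.einsum('ijk,j,k->i', T, b, c); a /= np.linalg.norm(a)
        b = np.einsum('ijk,i,k->j', T, a, c); b /= np.linalg.norm(b)
        c = np.einsum('ijk,i,j->k', T, a, b); c /= np.linalg.norm(c)
    lam = np.einsum('ijk,i,j,k->', T, a, b, c)
    return lam, a, b, c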